36 results for Job Demands-Resources model

in CentAUR: Central Archive, University of Reading - UK


Relevance:

50.00%

Publisher:

Abstract:

This paper is concerned with the quantification of the likely effect of anthropogenic climate change on the water resources of Jordan by the end of the twenty-first century. Specifically, a suite of hydrological models are used in conjunction with modelled outcomes from a regional climate model, HadRM3, and a weather generator to determine how future flows in the upper River Jordan and in the Wadi Faynan may change. The results indicate that groundwater will play an important role in the water security of the country as irrigation demands increase. Given future projections of reduced winter rainfall and increased near-surface air temperatures, the already low groundwater recharge will decrease further. Interestingly, the modelled discharge at the Wadi Faynan indicates that extreme flood flows will increase in magnitude, despite a decrease in the mean annual rainfall. Simulations projected no increase in flood magnitude in the upper River Jordan. Discussion focuses on the utility of the modelling framework, the problems of making quantitative forecasts and the implications of reduced water availability in Jordan.

Relevance:

30.00%

Publisher:

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al., 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours when running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts.
G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al., 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al., 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine. This is unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun". (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run.
(5) The scientist monitors the output files, using familiar analysis and visualization tools on his or her local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid Service.
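
The workflow change in step (3) can be sketched as follows. This is a hedged illustration only: the service URL is hypothetical, and the exact argument order of the GRexRun client is an assumption (the abstract says only that mpirun calls are replaced with GRexRun calls).

```python
# Hedged sketch (not the actual G-Rex distribution): rewriting a workflow
# script so that direct "mpirun" invocations go through the GRexRun client.
# The service URL and the GRexRun argument order are assumptions.

def adapt_for_grex(script_lines, service_url="http://cluster.example/G-Rex/nemo"):
    """Replace mpirun invocations with calls to the GRexRun client."""
    adapted = []
    for line in script_lines:
        stripped = line.strip()
        if stripped.startswith("mpirun"):
            # Forward the original command tail to the (assumed) client syntax
            args = stripped[len("mpirun"):].strip()
            adapted.append(f"GRexRun {service_url} {args}")
        else:
            adapted.append(line)
    return adapted

workflow = [
    "./prepare_inputs.sh year_0042",
    "mpirun -np 40 ./nemo.exe",
    "./postprocess.sh",
]
print(adapt_for_grex(workflow)[1])
# → GRexRun http://cluster.example/G-Rex/nemo -np 40 ./nemo.exe
```

The point of the design is that only the model-launch line changes; the pre- and post-processing steps of the existing script run untouched on the local machine.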

Relevance:

30.00%

Publisher:

Abstract:

The prediction of climate variability and change requires the use of a range of simulation models. Multiple climate model simulations are needed to sample the inherent uncertainties in seasonal to centennial prediction. Because climate models are computationally expensive, there is a trade-off between complexity, spatial resolution, simulation length, and ensemble size. The methods used to assess climate impacts are examined in the context of this trade-off. An emphasis on complexity allows simulation of coupled mechanisms, such as the carbon cycle and feedbacks between agricultural land management and climate. In addition to improving skill, greater spatial resolution increases relevance to regional planning. Greater ensemble size improves the sampling of probabilities. Research from major international projects is used to show the importance of synergistic research efforts. The primary climate impact examined is crop yield, although many of the issues discussed are relevant to hydrology and health modeling. Methods used to bridge the scale gap between climate and crop models are reviewed. Recent advances include large-area crop modeling, quantification of uncertainty in crop yield, and fully integrated crop–climate modeling. The implications of trends in computer power, including supercomputers, are also discussed.
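
The trade-off between complexity, resolution, simulation length and ensemble size can be made concrete with a back-of-envelope calculation. All figures below are invented, and the factor-of-eight cost of doubling horizontal resolution (two horizontal dimensions plus a shorter timestep) is a common rule of thumb rather than a property of any particular model.

```python
# Hedged sketch: under a fixed compute budget, ensemble size trades off
# against resolution and simulation length. Illustrative numbers only.

def ensemble_size(budget_cpu_hours, cost_per_year, years, resolution_factor):
    """How many ensemble members fit in the budget?

    Doubling horizontal resolution roughly multiplies the cost per
    simulated year by 8 (2x in each horizontal dimension, halved timestep).
    """
    cost_per_member = cost_per_year * (resolution_factor ** 3) * years
    return budget_cpu_hours // cost_per_member

# Same budget, same 50-year simulations, at base and doubled resolution:
print(ensemble_size(1_000_000, cost_per_year=100, years=50, resolution_factor=1))  # 200
print(ensemble_size(1_000_000, cost_per_year=100, years=50, resolution_factor=2))  # 25
```

Doubling resolution shrinks the affordable ensemble from 200 members to 25, which is the sampling-versus-resolution tension the abstract describes.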

Relevance:

30.00%

Publisher:

Abstract:

An integrated approach to climate change impact assessment is explored by linking established models of regional climate (SDSM), water resources (CATCHMOD) and water quality (INCA) within a single framework. A case study of the River Kennet illustrates how the system can be used to investigate aspects of climate change uncertainty, deployable water resources, and water quality dynamics in upper and lower reaches of the drainage network. The results confirm the large uncertainty in climate change scenarios and freshwater impacts due to the choice of general circulation model (GCM). This uncertainty is shown to be greatest during summer months, as evidenced by large variations between GCM-derived projections of future low river flows, deployable yield from groundwater, severity of nutrient flushing episodes, and long-term trends in surface water quality. Other impacts arising from agricultural land-use reform or delivery of EU Water Framework Directive objectives under climate change could be evaluated using the same framework.
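
The linked-framework idea, with each component model consuming the previous one's output so that GCM uncertainty propagates through to water quality, can be sketched with toy stand-ins. None of the functions below represent the real SDSM, CATCHMOD or INCA models, and every number is invented for illustration.

```python
# Hedged sketch of model chaining: climate -> flow -> quality.
# Toy stand-ins only; all parameters are invented.

def downscale_climate(gcm_rainfall_mm, scaling=0.8):
    """Stand-in for statistical downscaling: scale GCM rainfall to catchment scale."""
    return gcm_rainfall_mm * scaling

def simulate_flow(rainfall_mm, runoff_coeff=0.4):
    """Stand-in for a rainfall-runoff model: convert rainfall to river flow."""
    return rainfall_mm * runoff_coeff

def simulate_nitrate(flow, baseline_load=10.0):
    """Stand-in for a water-quality model: lower flows dilute less,
    so the toy concentration rises as flow falls."""
    return baseline_load / max(flow, 1e-9)

# Propagate climate uncertainty: two hypothetical GCM summer-rainfall projections
for gcm, rain in {"GCM-A": 50.0, "GCM-B": 30.0}.items():
    flow = simulate_flow(downscale_climate(rain))
    print(gcm, round(flow, 1), round(simulate_nitrate(flow), 2))
```

Running both projections through the same chain shows how a spread in GCM rainfall widens into a spread in both flow and concentration, which is the summer-month uncertainty the abstract reports.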

Relevance:

30.00%

Publisher:

Abstract:

The objective of this work was to construct a dynamic model of hepatic amino acid metabolism in the lactating dairy cow that could be parameterized using net flow data from in vivo experiments. The model considers 22 amino acids, ammonia, urea, and 13 energetic metabolites, and was parameterized using a steady-state balance model and two in vivo net flow experiments conducted with mid-lactation dairy cows. Extracellular flows were derived directly from the observed data. An optimization routine was used to derive nine intracellular flows. The resulting dynamic model was found to be stable across a range of inputs, suggesting that it can be perturbed and applied to other physiological states. Although nitrogen was generally in balance, leucine was in slight deficit compared to predicted needs for export protein synthesis, suggesting that an alternative source of leucine (e.g. peptides) was utilized. Simulations of varying glucagon concentrations indicated that an additional 5 mol/d of glucose could be synthesized at the reference substrate concentrations and blood flows. The increased glucose production was supported by increased removal from blood of lactate, glutamate, aspartate, alanine, asparagine, and glutamine. As glucose output increased, ketone body and acetate release increased while CO2 release declined. The pattern of amino acids appearing in hepatic vein blood was affected by changes in amino acid concentration in portal vein blood, portal blood flow rate and glucagon concentration, with methionine and phenylalanine being the most affected of essential amino acids. Experimental evidence is insufficient to determine whether essential amino acids are affected by varying gluconeogenic demands.
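
The steady-state balance used for parameterization rests on a simple principle: at steady state, flows into each metabolite pool must equal flows out, so an unmeasured flow can be recovered by difference. A minimal sketch with invented numbers (not the paper's 22-amino-acid system):

```python
# Hedged sketch of a steady-state pool balance. At steady state,
# sum(inflows) == sum(outflows), so one unknown flow is the difference.
# All flow values below are illustrative, not from the paper.

def solve_unknown_outflow(inflows, known_outflows):
    """Return the single unknown outflow that balances a pool at steady state."""
    return sum(inflows) - sum(known_outflows)

# Hypothetical hepatic amino acid pool (mol/d, made-up numbers):
uptake_from_blood = 4.0
from_proteolysis = 1.5
to_export_protein = 2.0

to_gluconeogenesis = solve_unknown_outflow(
    inflows=[uptake_from_blood, from_proteolysis],
    known_outflows=[to_export_protein],
)
print(to_gluconeogenesis)  # 3.5
```

The paper's optimization routine generalizes this idea: with nine intracellular flows and multiple coupled pools, the unknowns are fitted jointly rather than solved pool by pool.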

Relevance:

30.00%

Publisher:

Abstract:

Restocking is a favoured option in supporting livelihoods after a disaster. With the depletion of local livestock populations, the introduction of new species and breeds will clearly affect biodiversity. Nevertheless, the impact of restocking on Animal Genetic Resources has been largely ignored. The aim of this paper, therefore, is to examine the consequences of restocking on biodiversity via a simple model. Utilising a hypothetical project based on cattle, the model demonstrates that more than one-third of the population was related to the original restocked animals after three generations. Under conditions of random breed selection, the figure declined to 20 per cent. The tool was then applied to a donor-led restocking project implemented in Bosnia-Herzegovina. By restocking primarily with Simmental cattle, the model demonstrated that the implementation of a single restocking project is likely to have accelerated the decline of the indigenous Busa breed by a further nine per cent. Thus, greater awareness of the long-term implications of restocking on biodiversity is required.
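
The generational spread of restocked ancestry can be illustrated with a simple random-mating sketch. This is not the paper's model, and the 5 per cent starting share is an invented figure, but it shows how a small restocked fraction can grow to roughly one-third of the population within three generations.

```python
# Hedged sketch of ancestry spread under random mating (not the paper's
# model). An offspring counts as "related" to the restocked stock if at
# least one of its two parents is related. f0 is the restocked share.

def related_fraction(f0, generations):
    f = f0
    for _ in range(generations):
        # P(at least one related parent) with random pairing
        f = 1 - (1 - f) ** 2
    return f

# With an invented 5% restocked share:
print(round(related_fraction(0.05, 3), 3))  # 0.337
```

Even without breed-selection effects, the related fraction roughly doubles each generation while it is small, which is why a single restocking project can shift breed composition so quickly.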

Relevance:

30.00%

Publisher:

Abstract:

1. The habitat components determining the structure of bee communities are well known when considering foraging resources; however, there are few data on the role of nesting resources. 2. As a model system, this study uses 21 diverse bee communities in a Mediterranean landscape comprising a variety of habitats regenerating after fire. The findings clearly demonstrate that a variety of nesting substrates and nest-building materials have key roles in organising the composition of bee communities. 3. The availability of bare ground and potential nesting cavities were the two primary factors influencing the structure of the entire bee community, the composition of guilds, and also the relative abundance of the dominant species. Other nesting resources shown to be important include availability of steep and sloping ground, abundance of plant species providing pithy stems, and the occurrence of pre-existing burrows. 4. Nesting resource availability and guild structure varied markedly across habitats in different stages of post-fire regeneration; however, in all cases, nest sites and nesting resources were important determinants of bee community structure.

Relevance:

30.00%

Publisher:

Abstract:

Attitudes to floristics have changed considerably during the past few decades as a result of increasing and often more focused consumer demands, heightened awareness of the threats to biodiversity, information flow and overload, and the application of electronic and web-based techniques to information handling and processing. This paper will examine these concerns in relation to our floristic knowledge and needs in the region of SW Asia. Particular reference will be made to the experience gained from the Euro+Med PlantBase project for the preparation of an electronic plant-information system for Europe and the Mediterranean, with a single core list of accepted plant names and synonyms, based on consensus taxonomy agreed by a specialist network. The many challenges (scientific, technical and organisational) that it has presented will be discussed, as well as the problems of handling nontaxonomic information from fields such as conservation, karyology, biosystematics and mapping. The question of regional cooperation and the sharing of efforts and resources will also be raised, and attention drawn to the recent planning workshop held in Rabat (May 2002) for establishing a technical cooperation network for taxonomic capacity building in North Africa as a possible model for the SW Asia region.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we list some new orthogonal main-effects plans for three-level designs for 4, 5 and 6 factors in 18 runs and compare them with designs obtained from the existing L18 orthogonal array. We show that these new designs have better projection properties and can provide better parameter estimates for a range of possible models. Additionally, we study designs in other, smaller run sizes when there are insufficient resources to perform an 18-run experiment. Plans for three-level designs for 4, 5 and 6 factors in 13 to 17 runs are given. We show that the best designs here are efficient and deserve strong consideration in many practical situations.
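
The orthogonality property behind such main-effects plans can be checked mechanically: two three-level factor columns are orthogonal if every combination of levels occurs equally often. A small sketch using the standard 9-run three-level full factorial (not one of the paper's new designs):

```python
from collections import Counter

# Hedged sketch: check main-effects orthogonality of two three-level
# factor columns by counting level combinations. The 9-run full
# factorial below is a textbook example, not a design from the paper.

def is_orthogonal(col_a, col_b):
    """True if all 9 level pairs of two three-level columns occur equally often."""
    counts = Counter(zip(col_a, col_b))
    return len(counts) == 9 and len(set(counts.values())) == 1

factor_a = [0, 0, 0, 1, 1, 1, 2, 2, 2]
factor_b = [0, 1, 2, 0, 1, 2, 0, 1, 2]
print(is_orthogonal(factor_a, factor_b))  # True
```

For the paper's 13 to 17-run plans, exact balance of this kind is impossible, which is why efficiency and projection properties, rather than strict orthogonality, become the evaluation criteria.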

Relevance:

30.00%

Publisher:

Abstract:

In order to organize distributed educational resources efficiently, and to provide active learners an integrated, extendible and cohesive interface to share the dynamically growing multimedia learning materials on the Internet, this paper proposes a generic resource organization model with semantic structures to improve expressiveness, scalability and cohesiveness. We developed an active learning system with semantic support that lets learners access and navigate learning resources in an efficient and flexible manner. We provide facilities for instructors to manipulate the structured educational resources via a convenient visual interface. We also developed a resource discovering and gathering engine based on complex semantic associations for several specific topics.

Relevance:

30.00%

Publisher:

Abstract:

The Danish Eulerian Model (DEM) is a powerful air pollution model, designed to calculate the concentrations of various dangerous species over a large geographical region (e.g. Europe). It takes into account the main physical and chemical processes between these species, the actual meteorological conditions, emissions, etc. This is a huge computational task and requires significant storage and CPU time, so parallel computing is essential for the efficient practical use of the model. Several efficient parallel versions of the model have been created over the past few years. A parallel version of DEM using the Message Passing Interface library (MPI) was implemented on two powerful supercomputers at EPCC, Edinburgh, available via the HPC-Europa programme for transnational access to research infrastructures in the EC: a Sun Fire E15K and an IBM HPCx cluster. Although the implementation is in principle the same for both supercomputers, a few modifications had to be made to port the code successfully to the IBM HPCx cluster. Performance analysis and parallel optimization were carried out next, and results from benchmarking experiments are presented in this paper. Another set of experiments was carried out to investigate the sensitivity of the model to variation of some chemical rate constants in the chemical submodel; certain modifications of the code were necessary for this task. The results obtained will be used for further sensitivity analysis studies using Monte Carlo simulation.
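
The rate-constant sensitivity experiment can be illustrated in miniature: perturb a rate constant by a random factor and observe the spread in the output of a toy first-order decay model, in the spirit of the Monte Carlo studies mentioned. This toy is not DEM's chemical submodel; every value below is invented.

```python
import math
import random

# Hedged sketch of a rate-constant sensitivity study on a toy model:
# c(t) = c0 * exp(-k * t), with k perturbed by +/-10%. Illustrative only.

def concentration(c0, k, t):
    """First-order decay: concentration after time t."""
    return c0 * math.exp(-k * t)

random.seed(42)  # reproducible perturbations
base_k = 0.5
samples = [
    concentration(100.0, base_k * random.uniform(0.9, 1.1), t=2.0)
    for _ in range(1000)
]

print(round(concentration(100.0, base_k, 2.0), 2))   # baseline: 36.79
print(round(max(samples) - min(samples), 2))          # spread caused by +/-10% in k
```

In the real study the same idea is applied to a large coupled chemistry scheme, which is why the code modifications and the computational cost of many perturbed runs become significant.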

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Large scientific applications are usually developed, tested and used by groups of geographically dispersed scientists. The problems associated with remote development and data sharing can be tackled using collaborative working environments. Various tools and software exist for creating collaborative working environments, and some currently available software frameworks use them to enable remote job submission and file transfer on top of existing grid infrastructures. However, for many large scientific applications, further effort is needed to prepare a framework that offers application-centric facilities. The Unified Air Pollution Model (UNI-DEM), developed by the Danish Environmental Research Institute, is an example of a large scientific application under continuous development and experimentation by different institutes in Europe. This paper sets out to design a collaborative distributed computing environment for UNI-DEM in particular, but the framework proposed may also suit many other large scientific applications.